3rd International Conference for Emerging Technology, INCET 2022 ; 2022.
Article in English | Scopus | ID: covidwho-2018891

ABSTRACT

In the Covid-19 era, we have become increasingly reliant on virtual interactions such as Zoom and Google Meet / Microsoft Teams calls. The video streams received from live webcams during these interactions are a rich source for researchers studying human emotion. Owing to its many applications in human-computer interaction (HCI), the analysis of emotion from facial expressions has attracted considerable interest in the research community. The primary objective of this study is to assess various emotions using facial expressions captured via a live web camera. Traditional approaches (conventional FER) rely on manual feature extraction before classifying the emotional state, whereas deep learning, convolutional neural networks, and transfer learning are now widely used for emotion classification because of their advanced mechanisms for extracting features from images. In this implementation, we use two widely adopted deep learning models, MTCNN and VGG-16, to extract features and classify seven distinct emotions based on facial landmarks in live video. Using the standard FER2013 dataset, we achieved a maximum accuracy of 97.23 percent for training and 60.2 percent for validation for emotion classification. © 2022 IEEE.
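The abstract describes a two-stage pipeline: MTCNN detects and crops the face, and a VGG-16 network produces scores over the seven FER2013 emotion classes. The trained models themselves are not reproduced here, but the final seven-way classification step can be sketched in plain NumPy; the label order follows the standard FER2013 convention, and the logit values below are invented purely for illustration:

```python
import numpy as np

# FER2013's seven emotion classes, in the dataset's standard label order.
FER2013_LABELS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    shifted = logits - np.max(logits, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)

def classify_emotion(logits):
    """Map a 7-dimensional logit vector (e.g. the output of a VGG-16
    classification head) to a probability distribution and a predicted label."""
    probs = softmax(np.asarray(logits, dtype=np.float64))
    return FER2013_LABELS[int(np.argmax(probs))], probs

# Illustrative logits only -- not real model output.
label, probs = classify_emotion([0.1, -1.2, 0.3, 2.5, 0.0, 1.1, 0.4])
print(label)  # happy
```

In a live-video setting, each frame would first pass through a face detector such as MTCNN, the cropped face would be resized to the VGG-16 input size, and the resulting logits would be fed to a step like `classify_emotion` above.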
